
Search in the Catalogues and Directories

Hits 1 – 12 of 12

1
Sonic Shapes: Visualizing vocal expression
In: http://social.cs.uiuc.edu/papers/pdfs/13-pietrowicz-icad.pdf (2013)
BASE
2
Widespread Worry and the Stock Market
In: http://social.cs.uiuc.edu/people/gilbert/pub/icwsm10.worry.gilbert.pdf (2010)
BASE
3
Blogs Are Echo Chambers: Blogs Are Echo Chambers
In: http://social.cs.uiuc.edu/papers/pdfs/hicss09-echo-gilbert.pdf (2009)
BASE
4
CodeSaw: A Social Visualization of Distributed Software Development
In: http://social.cs.uiuc.edu/papers/pdfs/codesaw-interact2007.pdf (2007)
BASE
5
General Terms
In: http://social.cs.uiuc.edu/papers/pdfs/1022p-hailpern.pdf
BASE
6
ACES: Aphasia Emulation, Realism, and the Turing Test
In: http://social.cs.uiuc.edu/papers/pdfs/11-ASSETS-acesTuring.pdf
BASE
7
ACES: A Cross-Discipline Platform and Method for Communication and Language Research
In: http://social.cs.uiuc.edu/papers/pdfs/13-hailpern-cscw.pdf
BASE
8
Designing Visualizations to Facilitate Multisyllabic Speech with Children with Autism and Speech Delays
In: http://social.cs.uiuc.edu/papers/pdfs/12-hailpern-dis.pdf
BASE
9
Phonetic Shapes:
In: http://social.cs.uiuc.edu/papers/pdfs/pietrowicz_CHIEA12.pdf
BASE
10
Was It Worth It? Summarizing and Navigating User Reviews with Natural Language Methods
BASE
11
Widespread Worry and the Stock Market (Proceedings of the Fourth International AAAI Conference on Weblogs and Social Media)
In: http://www.aaai.org/ocs/index.php/ICWSM/ICWSM10/paper/viewFile/1513/1833/
BASE
12
A3: HCI Coding Guideline for Research Using Video Annotation to Assess Behavior of Nonverbal Subjects with Computer-Based Intervention
In: http://social.cs.uiuc.edu/papers/pdfs/TASSETS_2009.pdf
Abstract: HCI studies assessing nonverbal individuals (especially those who do not communicate through traditional linguistic means: spoken, written, or sign) are a daunting undertaking. Without the use of directed tasks, interviews, questionnaires, or question-answer sessions, researchers must rely fully upon observation of behavior, and the categorization and quantification of the participant’s actions. This problem is compounded further by the lack of metrics to quantify the behavior of nonverbal subjects in computer-based intervention contexts. We present a set of dependent variables called A3 (pronounced A-Cubed), or Annotation for ASD Analysis, to assess the behavior of this demographic of users, specifically focusing on engagement and vocalization. This paper demonstrates how theory from multiple disciplines can be brought together to create a set of dependent variables, as well as a demonstration of these variables in an experimental context. Through an examination of the existing literature and a detailed analysis of the current state of computer vision and speech detection, we present how computer automation may be integrated with the A3 guidelines to reduce coding time and potentially increase accuracy. We conclude by presenting how and where these variables can be used in multiple research areas and with varied target populations.
Keyword: General Terms
URL: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.168.5662
http://social.cs.uiuc.edu/papers/pdfs/TASSETS_2009.pdf
BASE

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 12